
    Intima-Media Thickness: Setting a Standard for a Completely Automated Method of Ultrasound Measurement

    The intima-media thickness (IMT) of the common carotid artery is a widely used clinical marker of severe cardiovascular diseases. IMT is usually measured manually on longitudinal B-mode ultrasound images. Many computer-based techniques for IMT measurement have been proposed to overcome the limits of manual segmentation; most of these, however, require a certain degree of user interaction. In this paper we describe a new completely automated layers extraction (CALEXia) technique for the segmentation and IMT measurement of the carotid wall in ultrasound images. CALEXia is based on an integrated approach consisting of feature extraction, line fitting, and classification that enables the automated tracing of the carotid adventitial walls. IMT is then measured by relying on a fuzzy K-means classifier. We tested CALEXia on a database of 200 images and compared its performance to that of a previously developed methodology based on signal analysis (CULEXsa). Three trained operators manually segmented the images, and the average profiles were taken as the ground truth. The average errors of CALEXia for the lumen-intima (LI) and media-adventitia (MA) interface tracings were 1.46 ± 1.51 pixels (0.091 ± 0.093 mm) and 0.40 ± 0.87 pixels (0.025 ± 0.055 mm), respectively. The corresponding errors for CULEXsa were 0.55 ± 0.51 pixels (0.035 ± 0.032 mm) and 0.59 ± 0.46 pixels (0.037 ± 0.029 mm). The IMT measurement error was 0.87 ± 0.56 pixels (0.054 ± 0.035 mm) for CALEXia and 0.12 ± 0.14 pixels (0.01 ± 0.01 mm) for CULEXsa. Thus, CALEXia showed limited performance in segmenting the LI interface, but outperformed CULEXsa on the MA interface and on the number of images that could not be processed (10 for CALEXia versus 16 for CULEXsa). Since the two techniques rely on complementary strategies, we anticipate fusing them to further improve IMT measurement.
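
    As a rough illustration of the final measurement step only (not the authors' CALEXia pipeline), the Python sketch below computes IMT as the mean column-wise distance between traced LI and MA interface profiles; the profile format is assumed, and the default conversion factor of about 0.062 mm/pixel is inferred from the paired pixel/mm figures quoted above, not a reported scanner value.

        import numpy as np

        def imt_from_interfaces(li_profile, ma_profile, mm_per_pixel=0.062):
            """Estimate IMT from traced interface profiles (illustrative sketch).

            li_profile, ma_profile: per-column depth coordinates (in pixels) of
            the lumen-intima and media-adventitia interfaces.
            mm_per_pixel: axial resolution, inferred here from the abstract's
            paired pixel/mm error figures (an assumption).
            """
            li = np.asarray(li_profile, dtype=float)
            ma = np.asarray(ma_profile, dtype=float)
            thickness_px = np.abs(ma - li)       # per-column wall thickness
            imt_px = float(thickness_px.mean())  # average over the segment
            return imt_px, imt_px * mm_per_pixel

        # Example with toy profiles (pixels): prints IMT in pixels and in mm
        print(imt_from_interfaces([40.0, 40.5, 41.0], [47.0, 47.4, 48.0]))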

    Improving information filtering via network manipulation

    Recommender systems are a promising way to address the problem of information overabundance for online users. Although information filtering for online commercial systems has received much attention recently, almost all previous works are dedicated to designing new algorithms and treat the user-item bipartite network as given and constant. However, many problems of recommender systems, such as the cold-start problem (i.e. low recommendation accuracy for small-degree items), are actually due to limitations of the underlying user-item bipartite network. In this letter, we propose a strategy to enhance the performance of existing recommendation algorithms by directly manipulating the user-item bipartite network, namely by adding virtual connections to it. Numerical analyses on two benchmark data sets, MovieLens and Netflix, show that our method can remarkably improve recommendation performance: it not only improves recommendation accuracy (especially for small-degree items), but also helps recommender systems generate more diverse and novel recommendations.
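
    As a minimal sketch of the network-manipulation idea (the rule for placing virtual links is our illustration, not necessarily the letter's exact strategy), the following Python adds virtual user-item connections for small-degree items before any recommendation algorithm is run:

        from collections import defaultdict

        def add_virtual_edges(user_items, degree_threshold=5, n_virtual=3):
            """Add virtual links for small-degree (cold-start) items.

            user_items: dict mapping user id -> set of item ids (the bipartite
            network). Each item seen by fewer than degree_threshold users gets
            up to n_virtual virtual links from the most active users not yet
            connected to it (an illustrative placement rule).
            """
            item_degree = defaultdict(int)
            for items in user_items.values():
                for i in items:
                    item_degree[i] += 1

            users_by_activity = sorted(user_items,
                                       key=lambda u: len(user_items[u]),
                                       reverse=True)
            for item, deg in item_degree.items():
                if deg >= degree_threshold:
                    continue
                added = 0
                for u in users_by_activity:
                    if added == n_virtual:
                        break
                    if item not in user_items[u]:
                        user_items[u].add(item)  # the virtual connection
                        added += 1
            return user_items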

    Identifying Proteins of High Designability via Surface-Exposure Patterns

    Using an off-lattice model, we fully enumerate the folded conformations of polypeptide chains of up to N = 19 monomers. Structures are found to differ markedly in designability, defined as the number of sequences with that structure as their unique lowest-energy conformation. We find that designability is closely correlated with the pattern of surface exposure of the folded structure. For longer chains, complete enumeration of structures is impractical. Instead, structures can be randomly sampled, and relative designability estimated either from designability within the random sample or directly from the surface-exposure pattern. We compare the surface-exposure patterns of structures identified as highly designable to the patterns of naturally occurring proteins.
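
    A toy Python sketch of the designability estimate by random sequence sampling (the paper's off-lattice model and its energy function are abstracted behind a user-supplied callable; the two-letter HP alphabet is our simplification):

        import random

        def estimate_designability(structures, energy, length=12,
                                   n_seq=1000, alphabet="HP"):
            """Count, for each structure, the sampled sequences for which it is
            the unique lowest-energy conformation (the designability estimate).

            structures: list of candidate folded conformations.
            energy(seq, s): user-supplied energy of sequence seq on structure s.
            """
            counts = [0] * len(structures)
            for _ in range(n_seq):
                seq = "".join(random.choice(alphabet) for _ in range(length))
                energies = [energy(seq, s) for s in structures]
                e_min = min(energies)
                winners = [k for k, e in enumerate(energies) if e == e_min]
                if len(winners) == 1:   # unique ground state required
                    counts[winners[0]] += 1
            return counts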

    B→K Transition Form Factor with Tensor Current within the k_T Factorization Approach

    In this paper, we apply the k_T factorization approach to deal with the B→K transition form factor with tensor current in the large recoil region. The main uncertainties of the estimate are discussed, and we obtain F_T^{B→K}(0) = 0.25 ± 0.01 ± 0.02, where the first error is caused by the uncertainties of the pionic wave functions and the second by those of the B-meson wave functions. This result is consistent with the light-cone sum rule results obtained in the literature.
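
    If the two quoted uncertainties are treated as independent (an assumption on our part; the paper quotes them separately), they combine in quadrature as

        \[
          F_T^{B\to K}(0) \;=\; 0.25 \pm \sqrt{0.01^2 + 0.02^2}
                          \;\approx\; 0.25 \pm 0.022 .
        \]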

    Challenges of Primary Frequency Control and Benefits of Primary Frequency Response Support from Electric Vehicles

    As the integration of wind generation displaces conventional plants, the system inertia provided by rotating mass declines, raising concerns over system frequency stability. This paper implements an advanced stochastic scheduling model with inertia-dependent fast frequency response requirements to investigate the challenges of primary frequency control in the future Great Britain electricity system. The results suggest that the required volume and the associated cost of primary frequency response increase significantly with the installed capacity of wind plants. Alternative measures (e.g. electric vehicles) have been proposed to alleviate these concerns. This paper therefore also analyses the benefits of primary frequency response support from electric vehicles in reducing system operation cost, wind curtailment and carbon emissions.
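
    The link between declining inertia and frequency stability can be illustrated with the standard swing-equation relation for the initial rate of change of frequency (RoCoF); this textbook formula and the example numbers are ours, not the paper's:

        def rocof(delta_p_mw, kinetic_energy_mws, f0=50.0):
            """Initial RoCoF (Hz/s) after an infeed loss, from the swing
            equation: RoCoF = -f0 * delta_P / (2 * E_kinetic).

            delta_p_mw: size of the generation loss in MW.
            kinetic_energy_mws: system stored kinetic energy in MW*s, which
            falls as wind displaces synchronous plant.
            f0: nominal frequency (50 Hz in Great Britain).
            """
            return -f0 * delta_p_mw / (2.0 * kinetic_energy_mws)

        # A 1,800 MW loss on 200 GW*s of stored energy gives about -0.225 Hz/s;
        # halving the stored kinetic energy doubles the RoCoF.
        print(rocof(1800, 200_000))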

    The Distribution of the Asymptotic Number of Citations to Sets of Publications by a Researcher or From an Academic Department Are Consistent With a Discrete Lognormal Model

    How to quantify the impact of a researcher's or an institution's body of work is a matter of increasing importance to scientists, funding agencies, and hiring committees. The use of bibliometric indicators, such as the h-index or the Journal Impact Factor, has become widespread despite their known limitations. We argue that most existing bibliometric indicators are inconsistent, biased, and, worst of all, susceptible to manipulation. Here, we pursue a principled approach to the development of an indicator that quantifies the scientific impact of both individual researchers and research institutions, grounded in the functional form of the distribution of the asymptotic number of citations. We validate our approach using the publication records of 1,283 researchers from seven scientific and engineering disciplines and of the chemistry departments at the 106 U.S. research institutions classified as having "very high research activity". Our approach has three distinct advantages. First, it accurately captures the overall scientific impact of researchers at all career stages, as measured by asymptotic citation counts. Second, unlike other measures, our indicator is resistant to manipulation and rewards publication quality over quantity. Third, our approach captures the time evolution of the scientific impact of research institutions.
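
    A minimal sketch of fitting the two parameters of a discrete lognormal to a set of asymptotic citation counts (moment matching on log(c + 1); the +1 offset for uncited papers and this simple estimator are our assumptions, not necessarily the paper's fitting procedure):

        import numpy as np

        def fit_discrete_lognormal(citations):
            """Estimate (mu, sigma) of a lognormal citation model by treating
            log(c + 1) as approximately normal (illustrative moment matching).
            """
            logs = np.log(np.asarray(citations, dtype=float) + 1.0)
            return logs.mean(), logs.std(ddof=1)

        # Hypothetical per-paper citation counts for one researcher
        mu, sigma = fit_discrete_lognormal([0, 3, 12, 25, 40, 88, 150])
        print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")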